
    Asymptotic Properties of Minimum S-Divergence Estimator for Discrete Models

    Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to the classical techniques based on maximum likelihood and related methods. Recently, Ghosh et al. (2013) proposed a general class of divergence measures, the S-Divergence Family, and discussed its usefulness in robust parametric estimation through numerical illustrations. In the present paper, we develop the asymptotic properties of the proposed minimum S-Divergence estimators under discrete models.
    Comment: Under review, 24 pages
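
    For orientation, the S-Divergence Family referenced here is indexed by a pair of tuning parameters (α, λ); in the non-degenerate case A, B ≠ 0, it is usually written, for a true density g and a model density f, as

        S_{(\alpha,\lambda)}(g,f) = \frac{1}{A}\int f^{1+\alpha}
            \;-\; \frac{1+\alpha}{AB}\int f^{B} g^{A}
            \;+\; \frac{1}{B}\int g^{1+\alpha},
        \qquad A = 1+\lambda(1-\alpha), \quad B = \alpha-\lambda(1-\alpha),

    so that A + B = 1 + α. Setting λ = 0 recovers the density power divergence with parameter α, and α = 0 recovers the power divergence family; the degenerate cases A = 0 or B = 0 are defined by continuous limits in Ghosh et al. (2013).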

    The Minimum S-Divergence Estimator under Continuous Models: The Basu-Lindsay Approach

    Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to the classical maximum likelihood-based techniques. Recently, Ghosh et al. (2013) proposed a general class of divergence measures for robust statistical inference, named the S-Divergence Family, and Ghosh (2014) derived the asymptotic properties of the corresponding estimators under discrete models. In the present paper, we develop the asymptotic properties of the proposed minimum S-Divergence estimators under continuous models. Here we use the Basu-Lindsay (1994) approach of smoothing the model densities, which, unlike previous approaches, avoids much of the complication of kernel bandwidth selection. The performance of the resulting estimators, in terms of both efficiency and robustness, is illustrated through extensive simulation studies and real data examples.
    Comment: Pre-print, 34 pages
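
    To make the recipe concrete, below is a minimal numerical sketch of minimum S-divergence estimation in the Basu-Lindsay style for a normal model: the same Gaussian kernel is applied to the data (giving a kernel density estimate) and to the model density, and the divergence between the two smoothed densities is minimized over the parameters. The function names (s_divergence, fit_normal_msde), the normal model, the bandwidth h = 0.5, and the tuning values α = 0.5, λ = 0 are all illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy import optimize
        from scipy.stats import norm

        def s_divergence(g, f, alpha, lam, grid):
            """S-divergence between densities g and f evaluated on a common
            uniform grid (non-degenerate branch A, B != 0); integrals are
            approximated by Riemann sums."""
            A = 1.0 + lam * (1.0 - alpha)
            B = alpha - lam * (1.0 - alpha)
            dx = grid[1] - grid[0]  # uniform grid spacing
            t1 = (f ** (1.0 + alpha)).sum() * dx / A
            t2 = (1.0 + alpha) / (A * B) * (f ** B * g ** A).sum() * dx
            t3 = (g ** (1.0 + alpha)).sum() * dx / B
            return t1 - t2 + t3

        def fit_normal_msde(data, alpha=0.5, lam=0.0, h=0.5):
            """Minimum S-divergence fit of a N(mu, sigma) model, smoothing
            BOTH the data and the model with the same Gaussian kernel of
            bandwidth h, in the spirit of the Basu-Lindsay approach."""
            grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 400)
            # g*: Gaussian kernel density estimate of the data
            g_star = norm.pdf(grid[:, None], loc=data[None, :], scale=h).mean(axis=1)

            def objective(theta):
                mu, log_sigma = theta
                sigma = np.exp(log_sigma)
                # f*: the normal model convolved with the same Gaussian kernel
                # is again normal, with variance sigma^2 + h^2 (closed form)
                f_star = norm.pdf(grid, loc=mu, scale=np.sqrt(sigma ** 2 + h ** 2))
                return s_divergence(g_star, f_star, alpha, lam, grid)

            res = optimize.minimize(objective, x0=[np.median(data), 0.0],
                                    method="Nelder-Mead")
            return res.x[0], np.exp(res.x[1])

        # Toy check: 5% gross outliers should barely move the location estimate.
        rng = np.random.default_rng(0)
        data = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])
        print(fit_normal_msde(data))  # mu close to 0, sigma close to 1

    Because a normal density convolved with a Gaussian kernel is again normal, the smoothed model is available in closed form in this sketch; for general continuous models the convolution would be evaluated numerically. Smoothing the model with the same kernel as the data is what makes the procedure comparatively insensitive to the choice of h.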